22 research outputs found

    Usability Inspection in Model-Driven Web Development: Empirical Validation in WebML

    Full text link
    There is a lack of empirically validated usability evaluation methods that can be applied to models in model-driven Web development. Evaluating these models allows early detection of usability problems perceived by the end-user. This motivated us to propose WUEP, a usability inspection method that can be integrated into different model-driven Web development processes. We previously demonstrated how WUEP can be used effectively when following the Object-Oriented Hypermedia method. In order to provide evidence of WUEP’s generalizability, this paper presents the operationalization and empirical validation of WUEP in another well-known method: WebML. The effectiveness, efficiency, perceived ease of use, and satisfaction of WUEP were evaluated in comparison with Heuristic Evaluation (HE) from the viewpoint of novice inspectors. The results show that WUEP was more effective and efficient than HE when detecting usability problems in models. Inspectors were also satisfied when applying WUEP, and found it easier to use than HE.
    Fernández Martínez, A.; Abrahao Gonzales, SM.; Insfrán Pelozo, CE.; Matera, M. (2013). Usability Inspection in Model-Driven Web Development: Empirical Validation in WebML. Lecture Notes in Computer Science 8107:740-756. doi:10.1007/978-3-642-41533-3_45
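    The comparison above rests on how effectiveness and efficiency are measured. The abstract does not give the formulas; a common convention in usability-inspection experiments, assumed here rather than taken from the paper, is to count only confirmed (real) usability problems, compute effectiveness as the share of known problems an inspector detects, and efficiency as real problems found per unit of inspection time. A minimal sketch with made-up numbers:

```python
# Minimal sketch of commonly used inspection metrics (an assumption for
# illustration, not the paper's exact definitions or data).

def effectiveness(real_problems_found: int, total_known_problems: int) -> float:
    """Share of the known usability problems that the inspector detected."""
    return real_problems_found / total_known_problems

def efficiency(real_problems_found: int, inspection_hours: float) -> float:
    """Real usability problems detected per hour of inspection."""
    return real_problems_found / inspection_hours

# Hypothetical inspector: 9 real problems found out of 15 known, in 1.5 hours.
print(f"effectiveness = {effectiveness(9, 15):.2f}")       # 0.60
print(f"efficiency    = {efficiency(9, 1.5):.1f} per hour")  # 6.0
```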

    Can filesharers be triggered by economic incentives? Results of an experiment

    Get PDF
    Illegal filesharing on the internet leads to considerable financial losses for artists and copyright owners as well as producers and sellers of music. Thus far, measures to contain this phenomenon have been rather restrictive. However, there are still a considerable number of illegal systems, and users are able to decide quite freely between legal and illegal downloads because the latter are still difficult to sanction. Recent economic approaches account for the improved bargaining position of users. They are based on the idea of revenue-splitting between professional sellers and peers. In order to test such an innovative business model, the study reported in this article carried out an experiment with 100 undergraduate students, forming five small peer-to-peer networks. The networks were confronted with different economic conditions. The results indicate that even experienced filesharers hold favourable attitudes towards revenue-splitting. They seem to be willing to adjust their behaviour to different economic conditions.

    A Tool for Detecting Bad Usability Smells in an Automatic Way

    No full text

    Tandem Browsing Toolkit

    No full text

    Flexible Reporting for Automated Usability and Accessibility Evaluation of Web Sites

    No full text
    A system for automatically evaluating the usability and accessibility of web sites by checking their HTML code against guidelines has been developed. All usability and accessibility guidelines are formally expressed in an XML-compliant specification language called Guideline Definition Language (GDL), so as to separate the evaluation engine from the evaluation logic (the guidelines). This separation makes it possible to manage guidelines (i.e., create, retrieve, update, and delete them) without affecting the code of the evaluation engine. The evaluation engine is coupled to a reporting system that automatically generates one or more evaluation reports in a flexible way: output can be adapted for screen reading or for print, and findings can be sorted by page, by object, by guideline, by priority, or by severity of the detected problems. This paper focuses on the reporting system.
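    The key design point here is keeping the guidelines outside the evaluation engine, in an XML document, and letting a separate reporting step re-sort the same findings along several axes. The actual GDL schema is not shown in the abstract, so the sketch below uses invented element names and a simple regex check as a stand-in for the real engine; it only illustrates the separation of guideline definitions from evaluation code and the flexible sorting of the resulting report.

```python
# Illustrative sketch only: the XML element names are invented (not the real
# GDL schema) and a regex check stands in for the actual evaluation engine.
import re
import xml.etree.ElementTree as ET

GUIDELINES_XML = """
<guidelines>
  <guideline id="img-alt" priority="1"
             description="Images should carry an alt attribute.">
    <pattern>&lt;img(?![^&gt;]*\\balt=)[^&gt;]*&gt;</pattern>
  </guideline>
  <guideline id="html-lang" priority="2"
             description="The html element should declare a lang attribute.">
    <pattern>&lt;html(?![^&gt;]*\\blang=)[^&gt;]*&gt;</pattern>
  </guideline>
</guidelines>
"""

def load_guidelines(xml_text):
    """Parse guideline definitions kept outside the engine, as GDL intends."""
    root = ET.fromstring(xml_text)
    return [
        {
            "id": g.get("id"),
            "priority": int(g.get("priority")),
            "message": g.get("description"),
            "pattern": re.compile(g.findtext("pattern"), re.IGNORECASE),
        }
        for g in root.findall("guideline")
    ]

def evaluate(html, guidelines):
    """Check the page's HTML against every guideline; one finding per violation."""
    return [
        {"guideline": g["id"], "priority": g["priority"], "message": g["message"]}
        for g in guidelines
        if g["pattern"].search(html)
    ]

def report(findings, sort_by="priority"):
    """Re-sort the same findings by 'priority' or 'guideline', mirroring the
    reporting system's ability to order results along different axes."""
    return sorted(findings, key=lambda f: f[sort_by])

page = "<html><head><title>Demo</title></head><body><img src='logo.png'></body></html>"
for finding in report(evaluate(page, load_guidelines(GUIDELINES_XML)), sort_by="priority"):
    print(finding)
```

    Because the guideline file is plain data, adding or retiring a rule never touches the evaluation or reporting code, which is the separation the GDL approach aims for.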